On the equivalence of two-layered perceptrons with binary neurons

Authors

  • Marcelo Blatt
  • Eytan Domany
  • Ido Kanter
Abstract

We consider two-layered perceptrons consisting of N binary input units, K binary hidden units and one binary output unit, in the limit N ≫ K ≥ 1. We prove that the weights of a regular irreducible network are uniquely determined by its input-output map, up to some obvious global symmetries. A network is regular if its K weight vectors from the input layer to the K hidden units are linearly independent. A (single-layered) perceptron is said to be irreducible if its output depends on every one of its input units; a two-layered perceptron is irreducible if the K + 1 perceptrons that constitute such a network are irreducible. By global symmetries we mean, for instance, permuting the labels of the hidden units. Hence, two irreducible regular two-layered perceptrons that implement the same Boolean function must have the same number of hidden units and must be composed of equivalent perceptrons.
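
To make the definitions above concrete, here is a minimal sketch (assuming NumPy; the function names, weights and thresholds are illustrative choices, not the paper's construction) of a two-layered perceptron with ±1 units. It checks regularity as a rank condition on the input-to-hidden weight matrix and verifies numerically that permuting the hidden-unit labels leaves the implemented Boolean function unchanged.

```python
# Illustrative sketch only: a two-layered perceptron with N binary (+/-1) inputs,
# K binary hidden units and one binary output. Weights and thresholds are
# arbitrary random values, not taken from the paper.
import numpy as np

def sign(x):
    # Binary (+/-1) units; ties are broken toward +1 for simplicity.
    return np.where(x >= 0, 1, -1)

def two_layer_output(x, W, theta, v, phi):
    """x: (N,) +/-1 input; W: (K, N) input-to-hidden weights; theta: (K,)
    hidden thresholds; v: (K,) hidden-to-output weights; phi: output threshold."""
    h = sign(W @ x - theta)      # K binary hidden units
    return sign(v @ h - phi)     # single binary output

rng = np.random.default_rng(0)
N, K = 8, 3
W = rng.normal(size=(K, N))
theta, v, phi = np.zeros(K), np.ones(K), 0.0

# Regularity: the K input-to-hidden weight vectors are linearly independent.
print("regular:", np.linalg.matrix_rank(W) == K)

# Global symmetry: relabeling the hidden units (permuting rows of W together
# with the entries of theta and v) leaves the input-output map unchanged.
perm = rng.permutation(K)
inputs = (sign(rng.normal(size=N)) for _ in range(100))
print("permutation symmetry preserved:",
      all(two_layer_output(x, W, theta, v, phi)
          == two_layer_output(x, W[perm], theta[perm], v[perm], phi)
          for x in inputs))
```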

Similar articles

The Design and Complexity of Exact Multilayered Perceptrons

We investigate the network complexity of multi-layered perceptrons for solving exactly a given problem. We limit our study to the class of combinatorial optimization problems. It is shown how these problems can be reformulated as binary classification problems and how they can be solved by multi-layered perceptrons.

Memorandum CaSaR 92-25: Exact Classification With Two-Layered Perceptrons

We study the capabilities of two-layered perceptrons for classifying exactly a given subset. Both necessary and sufficient conditions are derived for subsets to be exactly classifiable with two-layered perceptrons that use the hard-limiting response function. The necessary conditions can be viewed as generalizations of the linear-separability condition of one-layered perceptrons and confirm the...
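
For intuition on why such conditions must go beyond linear separability, the sketch below is a standard textbook construction (not taken from the memorandum; the weights are hand-picked illustrative values, assuming NumPy): a two-layered perceptron with hard-limiting units that classifies the XOR set exactly, even though XOR is not linearly separable.

```python
# Standard illustration: XOR is not linearly separable, yet a two-layered
# perceptron with hard-limiting units classifies it exactly.
import numpy as np

hard_limit = lambda x: (x >= 0).astype(int)   # hard-limiting response function

def two_layer(x, W1, b1, w2, b2):
    h = hard_limit(W1 @ x + b1)     # first layer: two threshold units
    return hard_limit(w2 @ h + b2)  # output unit

W1 = np.array([[1.0, 1.0],   # fires when x1 + x2 >= 0.5 (OR-like unit)
               [1.0, 1.0]])  # fires when x1 + x2 >= 1.5 (AND-like unit)
b1 = np.array([-0.5, -1.5])
w2 = np.array([1.0, -2.0])   # OR minus 2*AND realizes XOR
b2 = -0.5

for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, int(two_layer(np.array(x, float), W1, b1, w2, b2)))
# prints 0, 1, 1, 0: the exact XOR classification
```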

Fuzzy number-valued fuzzy relation

It is a well-known fact that binary relations are generalizations of mathematical functions. Unlike a function from a domain to a range, a binary relation may assign two or more elements of the range to each element of the domain. Some basic operations on functions, such as the inverse and composition, are applicable to binary relations as well. Depending on whether the domain, the range, or both are fuzzy-valued fuzzy sets, ...

Topological structure on generalized approximation space related to n-ary relation

The classical structure of rough set theory was first formulated by Z. Pawlak in [6]. The foundation of its object classification is an equivalence binary relation and its equivalence classes. The upper and lower approximation operations are two core notions in rough set theory. They can also be seen as a closure operator and an interior operator of the topology induced by an equivalence relation on a u...
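
The lower and upper approximation operators are easy to state in code. The sketch below is illustrative only (the universe, partition and target set are made-up examples, not from the cited work): given the partition of a universe into equivalence classes, the lower approximation collects the classes contained in X and the upper approximation collects the classes that meet X.

```python
# Pawlak-style lower/upper approximations of a set X, given the partition of a
# small universe into equivalence classes of an indiscernibility relation.
def approximations(partition, X):
    X = set(X)
    lower = set().union(*(c for c in partition if set(c) <= X))  # interior-like
    upper = set().union(*(c for c in partition if set(c) & X))   # closure-like
    return lower, upper

partition = [{1, 2}, {3, 4}, {5, 6}]   # equivalence classes of the universe {1..6}
X = {2, 3, 4}
lower, upper = approximations(partition, X)
print("lower approximation:", lower)   # {3, 4}: classes fully inside X
print("upper approximation:", upper)   # {1, 2, 3, 4}: classes meeting X
```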

Classification using multi-layered perceptrons

There has been an increasing interest in the applicability of neural networks in disparate domains. In this paper, we describe the use of multi-layered perceptrons, a type of neural network topology, for financial classification problems, with promising results. Backpropagation, which is the learning algorithm most often used in multilayered perceptrons, however, is inherently an inefficient se...

Journal:
  • International Journal of Neural Systems

Volume: 6, Issue: 3

Pages: -

Publication date: 1995